% scribe: Cristine Pinto
% lastupdate: 26 October 2005
% lecture: 16
% references: Durrett, Section 2.3
% title: Continuity Theorem for Characteristic Functions
% keywords: characteristic functions, inversion formula, inversion formula of characteristic function, Helly's selection theorem, tightness, determining class, uniqueness theorem, Continuity Theorem
% end
\documentclass[12pt, letterpaper]{article}
\include{macros}
\def\vcv{\stackrel{\scriptscriptstyle v}{\longrightarrow}} % vague convergence
\begin{document}
\lecture{16}{Continuity Theorem for Characteristic Functions} {Cristine Pinto}{cristine@econ.berkeley.edu}

References: \cite{durrett}, section 2.3.

\section{Review of the Inversion Formula}
% Keywords: characteristic functions, inversion formula, inversion formula of characteristic function
% end
Recall that if $X$ is a random variable, its characteristic function is
\[\varphi_X(t) =\E \left[ e^{itX} \right].\]
In the last lecture, we proved the Uniqueness Theorem for characteristic functions. We also learned the \emph{inversion formula}: if $\int |\varphi_{X}(t)|\,dt < \infty$, then $X$ has a bounded continuous density
%
\[f_X(x)=\frac{1}{2\pi}\int e^{-itx}\varphi_{X}(t)\,dt.\]
%
\section{Continuity Theorem for Characteristic Functions}
Suppose we have a sequence of distributions on the line $(\P_n)$ with characteristic functions
%
\[\varphi_n(t)=\int e^{itx} \,\P_n(dx)=\E \left[ e^{itX_n} \right]\]
%
for $X_n \sim \P_n$. We want to be able to tell that $\P_n$ converges in distribution to some limit on $\R$ by looking at the sequence $\varphi_n(t)$. Recall that if $\P_n \dcv \P$ then
%
\[\int f d\P_n\longrightarrow \int f d\P \mbox{ for every bounded continuous function } f.\]
%
So in particular for $f(x)=e^{itx}$, we get $\varphi_n(t)\longrightarrow\varphi(t)$ as $n\longrightarrow\infty$, where $\varphi(t)=\int e^{itx} \,\P(dx)$ is the characteristic function of $\P$. Consider the converse.
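As an aside before taking up the converse (this check is not part of the original notes), the inversion formula above can be verified numerically. The sketch below, assuming NumPy is available, recovers the standard normal density from its characteristic function $\varphi(t)=e^{-t^2/2}$ by discretizing the inversion integral:

```python
# Numerical check of the inversion formula f_X(x) = (1/2pi) \int e^{-itx} phi(t) dt
# for X ~ N(0,1), whose c.f. phi(t) = exp(-t^2/2) is integrable.
import numpy as np

t = np.linspace(-30.0, 30.0, 200001)   # phi(t) is negligible beyond |t| ~ 10
dt = t[1] - t[0]
phi = np.exp(-t**2 / 2)                # characteristic function of N(0,1)

def inverted_density(x):
    # Riemann-sum approximation of (1/2pi) * integral of e^{-itx} phi(t) dt;
    # the imaginary part vanishes by symmetry, so we keep the real part.
    return ((np.exp(-1j * t * x) * phi).sum() * dt).real / (2 * np.pi)

# Compare with the standard normal density at a few points.
for x in [0.0, 0.5, 2.0]:
    exact = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    assert abs(inverted_density(x) - exact) < 1e-8
```

The truncation to $|t|\leq 30$ is harmless here because $\varphi$ decays super-exponentially; a heavier-tailed characteristic function would require a wider window and a finer grid.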
Suppose we have a sequence $\P_n$ with characteristic functions $\varphi_n(t)$ such that $\varphi_n(t)\longrightarrow\varphi(t)$ as $n\longrightarrow\infty$ for some function $\varphi(t)$. Without further assumptions, we cannot conclude that $\P_n \dcv \P$ for some probability measure $\P$ with characteristic function $\varphi$.
\begin{example} Let $\P_n=N(0,n)$, so that $\varphi_n(t)=e^{-\frac{1}{2}nt^2}$, $t\in\R$. Notice that $\varphi_n(t)\longrightarrow \1(t=0)$: the pointwise limit exists, but it is discontinuous at $t=0$ and is not the characteristic function of any probability measure on $\R$. Indeed, \[\P_n\vcv\frac{1}{2}\delta_{-\infty}+\frac{1}{2}\delta_{+\infty}.\] \end{example}
Recall the idea of \emph{tightness}: $(\P_n)$ is tight if \[\lim_{x \rightarrow \infty}\sup_n\P_n(-x,x)^c=0.\] We present the first (easy) version of the continuity theorem for characteristic functions.
\begin{theorem} \label{easyctythm} Let $\P_n$ be probability measures on $\R$ with c.f.\ $\varphi_n$. If: \begin{enumerate} \item $\lim_{n \rightarrow \infty}\varphi_n(t)=\varphi(t)$ exists for every $t\in\R$; and \item $(\P_n)$ is tight, \end{enumerate} then $\P_n\dcv \P$, where $\P$ is a probability measure on $\R$ with c.f.\ $\varphi$. \end{theorem}
\begin{proof} By Helly's selection theorem, to prove that there exists a $\P$ such that $\P_n\dcv \P$, it is enough to show that there exists a $\P$ such that every subsequence of $(\P_n)$ has a further subsequence which converges to $\P$. To find a suitable $\P$, recall the general theorem: Let $\mathcal C$ be a determining collection of bounded continuous functions. If: \begin{enumerate} \item $\lim_{n \rightarrow \infty}\int f d\P_n$ exists for all $f\in\mathcal C$; and \item $(\P_n)$ is tight, \end{enumerate} then \[\P_n\dcv \P \mbox{ where } \int f d\P=\lim_{n \rightarrow \infty}\int f d\P_n,\, \forall f\in\mathcal C.\] Apply this general theorem to \[ \mathcal C = \left\{ f \mbox{ of the form } f(x) = \sin(tx), \,t \in \R \mbox{ or } f(x)=\cos(tx), \, t \in \R \right\}. \] $\mathcal C$ is determining by the uniqueness theorem for c.f.'s.
\end{proof}
This form of the continuity theorem is adequate for most applications, such as the CLT. Usually in the CLT we try to show \[Z_n=\frac{S_n}{\sqrt{\E(S_n^2)}}\dcv N(0,1).\] In this case, $\P_n$ is the distribution of $Z_n$, with $\E(Z_n)=0$ and $\E(Z_n^2)=1$. Clearly, $(\P_n)$ is tight, by Chebyshev's inequality: \[\P_n(-x,x)^c=\P(|Z_n|>x)\leq\frac {\E(Z_n^2)}{x^2}=\frac{1}{x^2},\] which decreases to $0$ as $x \uparrow \infty$. So, if we continue to assume condition 1 of Theorem \ref{easyctythm}, condition 2 implies that there exists some $\P$ with c.f.\ $\varphi$ such that: \begin{enumerate} \item[(2a)] $\varphi$ is the characteristic function of \emph{some} distribution. This also implies \item[(2b)] $\varphi$ is continuous as a function of $t$. This, in turn, implies \item[(2c)] $t\mapsto\varphi(t)$ is continuous at $t=0$. \end{enumerate} Paul L\'evy found that, given condition 1, (2c) is \emph{equivalent} to condition 2 of Theorem \ref{easyctythm}. \begin{theorem}[L\'evy Continuity Theorem for c.f.'s] Given $\P_n$ with c.f.\ $\varphi_n$, if: \begin{enumerate} \item $\lim_{n \rightarrow \infty} \varphi_n(t)=\varphi(t)$ exists for all $t\in\R$; and \item $t\mapsto\varphi(t)$ is continuous at $t=0$, \end{enumerate} then \[\P_n\dcv\P \mbox{ with } \int e^{itx} \,\P(dx)= \varphi(t).\] \end{theorem} \begin{proof} We just need to show that continuity of $\varphi$ at $0$ implies that $(\P_n)$ is tight. For that, it's most convincing to use a genuine bound (\cite{durrett}, p.\ 98):
%
\[\frac{1}{u}\int_{-u}^{u}[1-\varphi_n(t)]dt \geq\P_n\left(-\frac{2}{u},\frac{2}{u}\right)^c.\]
%
Recall that $\varphi_n(t)\longrightarrow\varphi(t)$ as $n\longrightarrow\infty$, that $\varphi$ is continuous at $t=0$, and that $\varphi(0)=1$ (since $\varphi_n(0)=1$ for every $n$).
As $n\longrightarrow\infty$, by bounded convergence (note $|1-\varphi_n(t)|\leq 2$), \[\frac{1}{u}\int^{u}_{-u}[1-\varphi_n(t)]dt \longrightarrow\frac{1}{u}\int^{u}_{-u}[1-\varphi(t)] dt,\] and, since $\varphi$ is continuous at $0$ with $\varphi(0)=1$, as $u\longrightarrow 0$, \[\frac{1}{u}\int^{u}_{-u}[1-\varphi(t)] dt\longrightarrow 0.\] Fix $\epsilon>0$ and choose $u$ small enough so that $\frac{1}{u}\int^{u}_{-u}[1-\varphi(t)]dt< \epsilon$. Choose $N$ large enough that \[\frac{1}{u}\int^{u}_{-u}[1-\varphi_n(t)]dt<2\epsilon \mbox{ for } n\geq N.\] Now we have \[\P_n\left(-\frac{2}{u},\frac{2}{u}\right)^c\leq2\epsilon \mbox{ for all } n\geq N.\] Since each of the finitely many measures $\P_1,\dots,\P_{N-1}$ is tight on its own, it follows that $\lim_{x\rightarrow \infty}\sup_{n} \P_n(-x,x)^c=0$, as desired. \end{proof}
\section{Exercises}
\begin{exercise}[Extra credit problem] \label{extracreditprob} Suppose $(\P_n)$ is a sequence of probability measures on $\R$ such that $\lim_{n \rightarrow \infty}\int f d\P_n$ exists and is finite for every bounded continuous $f$. Then (you check): there exists a unique probability measure $\P$ such that $\P_n\dcv\P$. Consequently: \[\int f d\P=\lim_{n \rightarrow \infty}\int f d\P_n.\] \end{exercise} \begin{exercise} What happens if we replace $\R$ in Exercise \ref{extracreditprob} by $\R^n$ or a generic metric space? \end{exercise} \begin{example}[Related to the exercises above] Consider: \[\mathcal C = \left\{f:f \mbox{ is bounded, continuous, and has compact support}\right\}\] (Note that if $f$ has compact support, then $f(x)=0$ for $|x|>B$ for some $B\geq 0$.) Check that $\mathcal C$ is a determining class. Consider \[\mathcal C_0 =\left\{f:f(0)=0 \mbox{ and $f$ is continuous with compact support}\right\};\] $\mathcal C_0$ is also a determining class. Let \begin{equation*} \P_n = \left\{ \begin{array}{ll} \delta_n & \mbox{ if $n$ is even}\\ \delta_0 & \mbox{ if $n$ is odd} \end{array}\right. \end{equation*} In this case, $\int f d\P_n \longrightarrow \int f d\delta_0$ for all $f\in\mathcal C_0$: if $f\in\mathcal C_0$ has support in $[-B,B]$, then $\int f d\P_n=f(n)=0$ for even $n>B$, and $\int f d\P_n=f(0)=0$ for odd $n$. Nevertheless, $\P_n \nrightarrow \delta_0$, since along even $n$ the mass escapes to $+\infty$.
The moral of this example is that to prove that $\P_n\dcv\P$ for some probability measure $\P$ on $\R$, it is not enough to show that $\int f d\P_n\longrightarrow\int f d\P$ for all $f$ in a determining class. \end{example}
\bibliographystyle{plain}
\bibliography{../books}
\end{document}